Fast Learning Rate of Multiple Kernel Learning: Trade-Off between Sparsity and Smoothness
Authors
Abstract
We investigate the learning rate of multiple kernel learning (MKL) with l1 and elastic-net regularizations. The elastic-net regularization is a composition of an l1-regularizer for inducing sparsity and an l2-regularizer for controlling smoothness. We focus on a sparse setting where the total number of kernels is large but the number of non-zero components of the ground truth is relatively small, and show convergence rates sharper than the learning rates previously shown for both l1 and elastic-net regularizations. Our analysis reveals a trade-off between sparsity and smoothness when choosing between the l1 and elastic-net regularizations: if the ground truth is smooth, the elastic-net regularization is preferred; otherwise, the l1 regularization is preferred.
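As a minimal illustration of how the two penalty terms act (hypothetical helper names and toy weights, not the paper's estimator), the elastic-net regularizer combines an l1 term, whose proximal operator soft-thresholds small components to exactly zero (sparsity), with an l2 term that shrinks the surviving components (smoothness):

```python
import numpy as np

def elastic_net_penalty(d, lam1, lam2):
    """Elastic-net regularizer: lam1 * ||d||_1 + lam2 * ||d||_2^2."""
    return lam1 * np.sum(np.abs(d)) + lam2 * np.sum(d ** 2)

def prox_elastic_net(v, lam1, lam2):
    """Proximal step of the elastic-net penalty: the l1 part soft-thresholds
    small components to exactly zero (sparsity), the l2 part shrinks the
    survivors toward zero (smoothness)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0) / (1.0 + 2.0 * lam2)

# hypothetical kernel-combination weights: the small first component is zeroed,
# the larger ones are kept but shrunken
d = np.array([0.05, -0.8, 0.0, 1.2])
print(prox_elastic_net(d, lam1=0.1, lam2=0.5))
```

Setting lam2 = 0 recovers the pure l1 (sparser, less smooth) case, which is the choice the abstract recommends for non-smooth ground truths.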
Similar papers
Adaptive Control of a Class of Nonlinear Discrete-Time Systems with Online Kernel Learning
An Online Kernel Learning based Adaptive Control (OKL-AC) framework for discrete-time affine nonlinear systems is presented in this paper. A sparsity strategy is proposed to control the complexity of the OKL identification model while trading off the demanded tracking precision against the complexity of the control law. The forward increasing and backward decreasing learning stages...
Non-parametric Group Orthogonal Matching Pursuit for Sparse Learning with Multiple Kernels
We consider regularized risk minimization in a large dictionary of reproducing kernel Hilbert spaces (RKHSs) over which the target function has a sparse representation. This setting, commonly referred to as Sparse Multiple Kernel Learning (MKL), may be viewed as the non-parametric extension of group sparsity in linear models. While the two dominant algorithmic strands of sparse learning, namely...
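A minimal sketch of the group-selection step behind group orthogonal matching pursuit (a toy linear analogy on synthetic data, not the paper's non-parametric RKHS algorithm; all names are hypothetical): at each iteration, pick the group of columns most correlated with the current residual, then refit least squares on the union of the selected groups.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
# nine features partitioned into three groups (each group standing in for
# one kernel's feature block in this toy linear analogy)
groups = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
X = rng.normal(size=(n, 9))
y = X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=n)  # only group 0 is active

def group_omp(X, y, groups, k):
    """Greedy group selection: choose the group whose columns are most
    correlated with the residual, then refit on all selected groups."""
    selected, remaining, r = [], list(range(len(groups))), y.copy()
    for _ in range(k):
        best = max(remaining, key=lambda i: np.linalg.norm(X[:, groups[i]].T @ r))
        selected.append(best)
        remaining.remove(best)
        cols = [j for i in selected for j in groups[i]]
        w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        r = y - X[:, cols] @ w
    return selected

print(group_omp(X, y, groups, 2))  # the active group 0 is found first
```

The group structure is what connects this to MKL: selecting a group corresponds to selecting a kernel, giving group sparsity rather than per-coordinate sparsity.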
Semi-Supervised Composite Kernel Learning Using Distance Metric Learning Techniques
A distance metric plays a key role in many machine learning and computer vision algorithms, so choosing an appropriate distance metric has a direct effect on their performance. Recently, distance metric learning using labeled data or other available supervisory information has become a very active research area in machine learning applications. Studies in this area have shown t...
Ultra-Fast Optimization Algorithm for Sparse Multi Kernel Learning
Many state-of-the-art approaches for Multi Kernel Learning (MKL) struggle to find a compromise between performance, sparsity of the solution, and speed of the optimization process. In this paper we look at the MKL problem from a learning and an optimization point of view at the same time. So, instead of designing a regularizer and then struggling to find an efficient method to minimize it, we de...
Feature selection for nonlinear models with extreme learning machines
In the context of feature selection, there is a trade-off between the number of selected features and the generalisation error. Two plots may help to summarise feature selection: the feature selection path and the sparsity-error trade-off curve. The feature selection path shows the best feature subset for each subset size, whereas the sparsity-error trade-off curve shows the corresponding gener...
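As a toy illustration of the sparsity-error trade-off curve (synthetic linear data and ordinary least squares rather than an extreme learning machine; all names are hypothetical), greedy forward selection produces one point of the curve per subset size:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
# only features 0 and 1 carry signal in this synthetic data
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=n)

def subset_error(cols):
    """Training mean squared error of least squares restricted to `cols`."""
    Xs = X[:, cols]
    w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return np.mean((y - Xs @ w) ** 2)

# greedy forward selection: `selected` traces the feature selection path,
# `curve` collects (subset size, error) points of the trade-off curve
selected, remaining, curve = [], list(range(p)), []
for _ in range(p):
    best = min(remaining, key=lambda j: subset_error(selected + [j]))
    selected.append(best)
    remaining.remove(best)
    curve.append((len(selected), subset_error(selected)))

for size, err in curve:
    print(size, round(err, 4))
```

The error here is training error, which can only decrease as features are added; the trade-off described above appears once the error is estimated on held-out data.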